Add missing channel_wise parameter to RandScaleIntensityFixedMean#8741

Open
engmohamedsalah wants to merge 3 commits into Project-MONAI:dev from engmohamedsalah:fix/add-channel-wise-randscaleintensityfixedmean-8363

Conversation

@engmohamedsalah (Contributor)

Summary

  • Adds the channel_wise parameter to RandScaleIntensityFixedMean and RandScaleIntensityFixedMeand, which was documented in docstrings but never implemented.
  • When channel_wise=True, a separate random scale factor is generated per channel, and preserve_range/fixed_mean are applied per channel — following the existing pattern from RandScaleIntensity.
  • Fixes docstring indentation for channel_wise in both array and dictionary transforms.

Fixes #8363
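As a rough illustration of the channel-wise behavior described above (plain NumPy, not the MONAI transform itself; the image shape and factor range are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
img = rng.random((3, 8, 8))  # assumed (C, H, W) image with 3 channels
factors = (-0.5, 0.5)

# channel_wise=True: draw one random factor per channel
per_channel = rng.uniform(low=factors[0], high=factors[1], size=img.shape[0])

# scale each channel by (1 + its own factor), then restack
out = np.stack([c * (1 + f) for c, f in zip(img, per_channel)])
```

With channel_wise=False the transform would instead draw a single factor and apply it to the whole array.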

Changes

monai/transforms/intensity/array.py

  • Added channel_wise parameter to RandScaleIntensityFixedMean.__init__
  • Updated randomize() to generate per-channel factors when channel_wise=True
  • Updated __call__ to apply per-channel scaling with individual random factors

monai/transforms/intensity/dictionary.py

  • Added channel_wise parameter to RandScaleIntensityFixedMeand.__init__
  • Updated __call__ to pass image data to randomize() (needed for channel count), following the pattern from RandScaleIntensityd

Tests

  • Added test_channel_wise and test_channel_wise_preserve_range to test_rand_scale_intensity_fixed_mean.py
  • Added test_channel_wise to test_rand_scale_intensity_fixed_meand.py

Test plan

  • All existing tests pass (no regressions)
  • New channel_wise tests pass for both array and dictionary transforms
  • Verified per-channel mean preservation with fixed_mean=True
  • Verified preserve_range clipping works per channel
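The per-channel mean-preservation property being tested can be sketched like this (a NumPy stand-in for the fixed_mean scaling, not a call into the MONAI transform; shapes and factor range are assumptions):

```python
import numpy as np

rng = np.random.default_rng(42)
img = rng.random((2, 4, 4))  # assumed 2-channel image
factors = rng.uniform(-0.3, 0.3, size=img.shape[0])

# fixed_mean scaling: subtract the channel mean, scale, then add the mean back,
# so each channel's mean is unchanged by the transform
out = np.stack([(c - c.mean()) * (1 + f) + c.mean() for c, f in zip(img, factors)])
```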

@coderabbitai (bot) commented Feb 14, 2026

No actionable comments were generated in the recent review. 🎉

ℹ️ Recent review info
  • Configuration used: Path: .coderabbit.yaml
  • Review profile: CHILL
  • Plan: Pro
  • Cache: Disabled due to data retention organization setting
  • Knowledge base: Disabled due to Reviews -> Disable Knowledge Base setting

📥 Commits

Reviewing files that changed from the base of the PR and between a1f8023 and f996a93.

📒 Files selected for processing (4)
  • monai/transforms/intensity/array.py
  • monai/transforms/intensity/dictionary.py
  • tests/transforms/test_rand_scale_intensity_fixed_mean.py
  • tests/transforms/test_rand_scale_intensity_fixed_meand.py
🚧 Files skipped from review as they are similar to previous changes (1)
  • tests/transforms/test_rand_scale_intensity_fixed_meand.py

📝 Walkthrough

Walkthrough

Adds channel-wise support to RandScaleIntensityFixedMean and its dictionary wrapper. The array transform receives a new channel_wise bool parameter, forwards it to the internal scaler, randomizes per-channel factors when enabled, and applies per-channel scaling by iterating channels, stacking results, and preserving dtype. The dictionary transform propagates channel_wise, adjusts randomization to reuse a key-specific factor, and includes an early tensor-conversion path for empty-key cases. Tests were added to validate per-channel scaling and preserve_range behavior.

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~25 minutes

Additional notes

Implementation defaults channel_wise=False to preserve backward compatibility. Review should focus on correctness of per-channel factor generation, channel iteration/stacking, dtype/preserve_range handling, and the adjusted dictionary call paths.

🚥 Pre-merge checks: ✅ 4 passed | ❌ 1 failed

❌ Failed checks (1 warning)
  • Docstring Coverage — ⚠️ Warning: docstring coverage is 27.27%, below the required threshold of 80.00%. Resolution: write docstrings for the functions missing them to satisfy the coverage threshold.

✅ Passed checks (4 passed)
  • Title check — ✅ Passed: the title accurately summarizes the main change: implementing the missing channel_wise parameter for RandScaleIntensityFixedMean.
  • Description check — ✅ Passed: the description is comprehensive and well-structured, covering changes in multiple files, test additions, and linking to the issue. All key details are present.
  • Linked Issues check — ✅ Passed: the PR fully addresses issue #8363 by implementing the documented but missing channel_wise parameter with per-channel factor generation and per-channel application of preserve_range and fixed_mean.
  • Out of Scope Changes check — ✅ Passed: all changes are directly scoped to implementing channel_wise support across array and dictionary transforms plus corresponding tests, with no extraneous modifications.


@engmohamedsalah force-pushed the fix/add-channel-wise-randscaleintensityfixedmean-8363 branch from 738a3f9 to da88dbe on February 23, 2026 at 14:22
…oject-MONAI#8363)

The channel_wise parameter was documented in the docstring but not actually
accepted by RandScaleIntensityFixedMean or its dictionary variant. This adds
the parameter with per-channel random factor generation and scaling, matching
the existing pattern in RandScaleIntensity. Also fixes docstring indentation.

Signed-off-by: Mohamed Salah <eng.mohamed.tawab@gmail.com>
@engmohamedsalah force-pushed the fix/add-channel-wise-randscaleintensityfixedmean-8363 branch from da88dbe to f996a93 on February 23, 2026 at 15:43
@ericspod (Member) left a comment:

Hi @engmohamedsalah, I made a few comments below, but it otherwise looks good to me with a few changes. Thanks!

factors: Sequence[float] | float = 0,
fixed_mean: bool = True,
preserve_range: bool = False,
channel_wise: bool = False,
@ericspod (Member):
Please add new arguments to the end of the argument list so that positional arguments aren't affected. I realise the argument was meant to be here given the docstring ordering, but if users call this constructor with positional arguments this will break compatibility.
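The compatibility concern can be illustrated with simplified, hypothetical signatures (the real constructor takes more parameters, e.g. prob and dtype; the names here are only for illustration):

```python
# Simplified, hypothetical stand-ins for the constructor signature.
def old_init(factors=0, fixed_mean=True, preserve_range=False, prob=0.1):
    return factors, fixed_mean, preserve_range, prob

# Inserting channel_wise in the middle shifts every later positional slot:
def new_init(factors=0, fixed_mean=True, preserve_range=False,
             channel_wise=False, prob=0.1):
    return factors, fixed_mean, preserve_range, channel_wise, prob

# A caller who previously passed prob positionally now sets channel_wise instead:
old = old_init(0.5, True, False, 0.9)  # prob=0.9, as intended
new = new_init(0.5, True, False, 0.9)  # channel_wise=0.9, prob silently falls back to 0.1
```

Appending channel_wise at the end of the parameter list instead leaves all existing positional calls unchanged.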

return None
self.factor = self.R.uniform(low=self.factors[0], high=self.factors[1])
if self.channel_wise:
self.factor = [self.R.uniform(low=self.factors[0], high=self.factors[1]) for _ in range(data.shape[0])] # type: ignore
@ericspod (Member):
Suggested change:
-    self.factor = [self.R.uniform(low=self.factors[0], high=self.factors[1]) for _ in range(data.shape[0])]  # type: ignore
+    self.factor = self.R.uniform(low=self.factors[0], high=self.factors[1], size=data.shape[:1])  # type: ignore

This might be slightly better.
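A quick sketch of the two forms, using a plain NumPy RandomState (the data shape is an assumption):

```python
import numpy as np

R = np.random.RandomState(0)
factors = (-0.5, 0.5)
data = np.zeros((4, 8, 8))  # assumed 4-channel image

# PR's form: one scalar draw per channel in a Python loop
a = [R.uniform(low=factors[0], high=factors[1]) for _ in range(data.shape[0])]

# reviewer's form: a single vectorized draw of one factor per channel
b = np.random.RandomState(0).uniform(low=factors[0], high=factors[1], size=data.shape[:1])
```

Both yield one factor per channel; the vectorized form avoids the loop and returns an ndarray rather than a list.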

Comment on lines +661 to +673
if self.channel_wise:
out = []
for i, d in enumerate(img):
out_channel = ScaleIntensityFixedMean(
factor=self.factor[i], # type: ignore
fixed_mean=self.fixed_mean,
preserve_range=self.preserve_range,
dtype=self.dtype,
)(d[None])[0]
out.append(out_channel)
ret: NdarrayOrTensor = torch.stack(out)
ret = convert_to_dst_type(ret, dst=img, dtype=self.dtype or img.dtype)[0]
return ret
@ericspod (Member):
Suggested change:
-        if self.channel_wise:
-            out = []
-            for i, d in enumerate(img):
-                out_channel = ScaleIntensityFixedMean(
-                    factor=self.factor[i],  # type: ignore
-                    fixed_mean=self.fixed_mean,
-                    preserve_range=self.preserve_range,
-                    dtype=self.dtype,
-                )(d[None])[0]
-                out.append(out_channel)
-            ret: NdarrayOrTensor = torch.stack(out)
-            ret = convert_to_dst_type(ret, dst=img, dtype=self.dtype or img.dtype)[0]
-            return ret
+        if self.channel_wise:
+            out = []
+            for i, d in enumerate(img):
+                scale_trans = ScaleIntensityFixedMean(
+                    factor=float(self.factor[i]),
+                    fixed_mean=self.fixed_mean,
+                    preserve_range=self.preserve_range,
+                    dtype=self.dtype,
+                )
+                out.append(scale_trans(d[None]))
+            ret: NdarrayOrTensor = torch.cat(out)
+            ret = convert_to_dst_type(ret, dst=img, dtype=self.dtype or img.dtype)[0]
+            return ret

I think this is a little more readable, the type ignore for factor may still be needed though.
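The two variants are equivalent in output; a minimal sketch of the cat-vs-stack difference (toy tensor, with doubling standing in for the actual scaling transform):

```python
import torch

img = torch.arange(24, dtype=torch.float32).reshape(2, 3, 4)  # (C, H, W)

# keep the channel dim via d[None], then concatenate along it
chans_keepdim = [d[None] * 2 for d in img]   # each (1, 3, 4)
cat_out = torch.cat(chans_keepdim)           # (2, 3, 4)

# drop the channel dim with [0], then re-add it via stack
chans_dropped = [(d[None] * 2)[0] for d in img]  # each (3, 4)
stack_out = torch.stack(chans_dropped)           # (2, 3, 4)
```

The suggested version avoids the indexing dance of adding a dimension only to strip it and restore it again.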

factors: Sequence[float] | float,
fixed_mean: bool = True,
preserve_range: bool = False,
channel_wise: bool = False,
@ericspod (Member):

Same issue here, despite the docstring argument being incorrectly included in this order.



Development

Successfully merging this pull request may close these issues.

channel_wise option missing in transforms.RandScaleIntensityFixedMean

2 participants